9 research outputs found

    High Resolution Schemes for Conservation Laws With Source Terms.

    This memoir is devoted to the study of the numerical treatment of source terms in hyperbolic conservation laws and systems. In particular, we study two types of situations that are particularly delicate from the point of view of their numerical approximation: the case of balance laws, with the shallow water system as the main example, and the case of hyperbolic equations with stiff source terms. The theoretical foundations of high-resolution total variation diminishing (TVD) schemes for homogeneous scalar conservation laws are firmly established, and this work builds on them. We analyze the properties of a second-order, flux-limited version of the Lax-Wendroff scheme which avoids oscillations around discontinuities while preserving steady states. When applied to homogeneous conservation laws, TVD schemes prevent an increase in the total variation of the numerical solution, hence guaranteeing the absence of numerically generated oscillations; they have been successfully implemented in the form of flux limiters or slope limiters for scalar conservation laws and systems. Our technique is based on a flux-limiting procedure applied only to those terms related to the physical flux derivative/Jacobian. We also extend the technique developed by Chiavassa and Donat to hyperbolic conservation laws with source terms and apply the multilevel technique to the shallow water system. With respect to the numerical treatment of stiff source terms, we take the simple model problem considered by LeVeque and Yee and study the properties of the numerical solution obtained with different numerical techniques. We identify the delay factor, which is responsible for the anomalous speed of propagation of the numerical solution on coarse grids. The delay is due to the introduction of non-equilibrium values through numerical dissipation, and can only be controlled by adequately refining the spatial resolution of the simulation. Explicit schemes suffer from the same numerical pathology, even after reducing the time step so that the stability requirements imposed by the fastest scales are satisfied. We study the behavior of Implicit-Explicit (IMEX) numerical techniques as a tool to obtain high-resolution simulations that incorporate the stiff source term in an implicit, systematic manner.
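    As a concrete illustration of the flux-limiter idea discussed above, the following is a minimal sketch (not the memoir's actual scheme) of a flux-limited, TVD version of the Lax-Wendroff method for the linear advection equation u_t + a u_x = 0 with a > 0 and periodic boundaries; the minmod-type limiter and all parameter values are illustrative assumptions.
    ```python
    # Hedged sketch: upwind flux plus a limited Lax-Wendroff anti-diffusive
    # correction (Sweby's flux-limiter form) for u_t + a u_x = 0, a > 0.
    import numpy as np

    def tvd_lax_wendroff_step(u, a, dt, dx):
        """Advance one time step; assumes 0 < a*dt/dx <= 1 and periodic BCs."""
        nu = a * dt / dx                          # Courant number
        du = np.roll(u, -1) - u                   # forward differences u_{i+1} - u_i
        eps = 1e-14                               # avoid division by zero
        r = np.roll(du, 1) / (du + np.where(du >= 0, eps, -eps))   # smoothness ratio
        phi = np.maximum(0.0, np.minimum(1.0, r))                  # minmod limiter
        # numerical flux F_{i+1/2} = a u_i + (a/2)(1 - nu) phi_i (u_{i+1} - u_i)
        F = a * u + 0.5 * a * (1.0 - nu) * phi * du
        return u - (dt / dx) * (F - np.roll(F, 1))

    # usage: advect a square pulse without spurious oscillations
    x = np.linspace(0.0, 1.0, 200, endpoint=False)
    u = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)
    a, dx = 1.0, x[1] - x[0]
    dt = 0.8 * dx / a
    for _ in range(100):
        u = tvd_lax_wendroff_step(u, a, dt, dx)
    ```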

    A multiscale method applied to shallow water flow

    A flux-limited second-order scheme with the C-property is used to solve the one- or two-dimensional Saint-Venant system for shallow water flows with non-flat bottom and friction terms, as introduced in [7] G. Haro, Numerical simulation of shallow water equations and some physical models in image processing, Ph.D. Thesis, Department of Technologies, Universitat Pompeu Fabra, Barcelona, 2005. High resolution at low cost can be obtained by applying a point-value multiresolution transform [2, 3, 9] in order to detect regions with singularities. The above method is applied in these regions, while a cheap polynomial interpolation is used in the smooth zones, thus lowering the computational cost.
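    The following sketch illustrates, under stated assumptions, the point-value multiresolution idea used to detect singularities: fine-grid values that are poorly predicted by interpolation from the coarse grid get large detail coefficients, and only the flagged regions receive the expensive scheme. The interpolation stencil and tolerance are illustrative, not taken from the paper.
    ```python
    # Hedged sketch of a point-value multiresolution singularity detector.
    import numpy as np

    def detail_coefficients(u_fine):
        """Details at the odd fine-grid points: stored value minus its 4-point
        centered (cubic) interpolation from the even, i.e. coarse-grid, points.
        Assumes an odd number of point values (2M - 1)."""
        u_fine = np.asarray(u_fine, dtype=float)
        assert u_fine.size % 2 == 1, "expected 2M - 1 point values"
        u_c = u_fine[::2]
        # predict midpoints from the four nearest coarse values
        pred = (-u_c[:-3] + 9 * u_c[1:-2] + 9 * u_c[2:-1] - u_c[3:]) / 16.0
        return u_fine[3:-2:2] - pred

    def flag_singular_regions(u_fine, tol=1e-3):
        """Boolean mask: True where the detail exceeds tol, i.e. where the
        expensive high-resolution scheme (rather than cheap interpolation)
        would be applied."""
        return np.abs(detail_coefficients(u_fine)) > tol
    ```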

    Adaptive multiple crossover genetic algorithm to solve workforce scheduling and routing problem

    The Workforce Scheduling and Routing Problem refers to the assignment of personnel to visits across various geographical locations. Solving this problem demands tackling numerous scheduling and routing constraints while aiming to minimise the operational cost. One of the main obstacles in designing a genetic algorithm for this problem is selecting the set of operators that gives the best performance of the Genetic Algorithm (GA). This paper presents an adaptive multiple-crossover genetic algorithm to tackle the combined setting of scheduling and routing problems. A mix of problem-specific and traditional crossovers is evaluated by using an online learning process to measure each operator's effectiveness. The best-performing operators are given high application rates, while low rates are given to the worst-performing ones, and the rates are dynamically adjusted according to the learning outcomes in a non-stationary environment. Experimental results show that the combined use of all the operators works better than using any single operator in isolation. This study contributes to our understanding of how to make effective use of crossover operators on this highly constrained optimisation problem.
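    A hedged sketch of one way the adaptive crossover mechanism described above could be realized, using probability matching: operators that recently produced better offspring receive higher application rates, with a floor so that no operator is ever discarded. The reward definition, decay factor and class names are illustrative assumptions, not the paper's exact update rule.
    ```python
    # Hedged sketch: adaptive selection of crossover operators by probability matching.
    import random

    class AdaptiveOperatorSelector:
        def __init__(self, operators, p_min=0.05, decay=0.8):
            self.operators = list(operators)          # crossover functions
            self.quality = {op: 1.0 for op in self.operators}
            self.p_min = p_min                        # minimum application rate
            self.decay = decay                        # weight given to past quality

        def rates(self):
            """Application rates proportional to quality, with a p_min floor."""
            total = sum(self.quality.values())
            k = len(self.operators)
            return {op: self.p_min + (1 - k * self.p_min) * q / total
                    for op, q in self.quality.items()}

        def select(self):
            r = self.rates()
            return random.choices(self.operators,
                                  weights=[r[op] for op in self.operators])[0]

        def feedback(self, op, reward):
            """reward: e.g. relative fitness improvement of offspring over parents."""
            self.quality[op] = self.decay * self.quality[op] + (1 - self.decay) * reward
    ```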

    Max–min dispersion with capacity and cost for a practical location problem

    Diversity and dispersion problems deal with selecting a subset of elements from a given set in such a way that their diversity is maximized. This study considers a practical location problem recently proposed in the context of max–min dispersion models. It is called the generalized dispersion problem, and it models realistic applications by introducing capacity and cost constraints. We propose two effective linear formulations for this problem and develop a hybrid metaheuristic algorithm based on the variable neighborhood search methodology to solve real instances. Extensive computational experiments are performed to compare our hybrid metaheuristic with the state-of-the-art heuristic and with integer linear programming (ILP) formulations. Results on public benchmark instances show the superiority of our proposal with respect to the previous algorithms. Our extensive experimentation reveals that the ILP models can solve medium-size instances to optimality with the Gurobi optimizer, although the metaheuristic outperforms ILP in both running time and solution quality on large-size instances.
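    To make the problem statement concrete, the sketch below evaluates the max–min dispersion objective and the capacity/cost feasibility of a candidate subset, and enumerates the 1-swap moves a VNS-style local search might explore. All names and the neighborhood choice are illustrative assumptions rather than the paper's exact components.
    ```python
    # Hedged sketch of the generalized (capacitated, cost-constrained)
    # max-min dispersion problem: pick a subset S of candidate sites.
    from itertools import combinations

    def dispersion(S, dist):
        """Max-min objective: smallest pairwise distance among selected sites.
        Assumes at least two selected sites; dist is a distance matrix/dict."""
        return min(dist[i][j] for i, j in combinations(sorted(S), 2))

    def feasible(S, capacity, cost, b, k):
        """Total capacity must cover demand b; total cost must stay within budget k."""
        return sum(capacity[i] for i in S) >= b and sum(cost[i] for i in S) <= k

    def swap_neighborhood(S, candidates):
        """1-swap moves used by VNS-style local search: exchange one selected
        site with one unselected candidate."""
        for i in S:
            for j in candidates - S:
                yield (S - {i}) | {j}
    ```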

    Heuristics for the Constrained Incremental Graph Drawing Problem

    Visualization of information is a relevant topic in Computer Science, where graphs have become a standard representation model and graph drawing is now a well-established area. Within this context, edge crossing minimization is a widely studied problem, given its importance in obtaining readable representations of graphs. In this paper, we focus on the so-called incremental graph drawing problem, in which we try to preserve the user's mental map across successive drawings of the same graph. In particular, we minimize the number of edge crossings while satisfying constraints that preserve the positions of vertices with respect to previous drawings. We propose heuristic methods to obtain high-quality solutions to this optimization problem within the short computational times required by graph drawing applications. We also propose a mathematical programming formulation and obtain the optimal solution for small and medium-sized instances. Our extensive experimentation shows the merit of our proposal with respect to both optimal solutions obtained with CPLEX and heuristic solutions obtained with LocalSolver, a well-known black-box solver for combinatorial optimization.
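    The sketch below shows the basic quantity being minimized: the number of edge crossings between two consecutive layers of a hierarchical drawing, given the vertex positions on each layer. It uses a straightforward O(m^2) count and does not reproduce the paper's heuristics or the positional constraints of the incremental setting.
    ```python
    # Hedged sketch: counting two-layer edge crossings from vertex positions.
    from itertools import combinations

    def two_layer_crossings(edges, pos_top, pos_bottom):
        """edges: iterable of (u, v) with u on the top layer, v on the bottom.
        pos_top / pos_bottom: dicts mapping vertices to their position index."""
        count = 0
        for (u1, v1), (u2, v2) in combinations(list(edges), 2):
            # two edges cross iff their endpoints appear in opposite orders
            if (pos_top[u1] - pos_top[u2]) * (pos_bottom[v1] - pos_bottom[v2]) < 0:
                count += 1
        return count

    # usage on a tiny example: the two edges cross exactly once
    edges = [("a", "y"), ("b", "x")]
    print(two_layer_crossings(edges, {"a": 0, "b": 1}, {"x": 0, "y": 1}))  # -> 1
    ```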

    Location under two conflicting perspectives: proximity or fair workload sharing?

    Work presented at R-Evolucionando el transporte, XIV Congreso de Ingeniería del Transporte (CIT 2021), held online on 6-8 July 2021 and organized by the Universidad de Burgos. In this work we address a location problem that, despite its great interest and applicability to real situations, has received little attention in the literature, even though more and more companies face it when deciding where to place their facilities. Generally, when we want to locate a set of facilities, whether bicycle stations, shopping centers or hospitals, among many others, we try to place them as close as possible to their demand points, that is, to the cyclists, customers or patients. At the same time, we also want the workload (the demand) to be shared homogeneously. The objectives to optimize are therefore to minimize the largest distance between the facilities and the demand points, and to balance the workload of the located facilities, where workload is understood as the number of demand points served by a facility. These two objectives are usually in conflict: locating facilities so that each demand point is as close as possible to the facility serving it may leave some facilities with a much higher workload than others, and, likewise, focusing on balancing the workload of the facilities may push demand points farther away. Due to the complexity of the resulting bi-objective combinatorial optimization problem, exact methods make its resolution costly or unfeasible. This leads us to propose a metaheuristic algorithm capable of solving it quickly while obtaining high-quality solutions. Specifically, we propose a hybrid algorithm based on Strategic Oscillation combined with Path Relinking that is able to provide a range of efficient, high-quality solutions.
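    As a rough illustration of the two conflicting objectives described above, the sketch below assigns each demand point to its closest open facility and returns (1) the largest demand-to-facility distance and (2) a workload-balance measure, taken here as the spread between the most and least loaded facility. Both definitions are illustrative readings of the abstract, not the paper's exact formulation.
    ```python
    # Hedged sketch: evaluating the two conflicting location objectives.
    from collections import Counter

    def evaluate(open_facilities, demand_points, dist):
        """dist[p][f]: distance from demand point p to facility f.
        Returns (max assignment distance, workload spread), both to minimize."""
        assignment = {p: min(open_facilities, key=lambda f: dist[p][f])
                      for p in demand_points}
        max_distance = max(dist[p][assignment[p]] for p in demand_points)
        loads = Counter(assignment.values())
        workload_spread = (max(loads.values())
                           - min(loads.get(f, 0) for f in open_facilities))
        return max_distance, workload_spread
    ```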

    Lung Micrometastases Display ECM Depletion and Softening While Macrometastases Are 30-Fold Stiffer and Enriched in Fibronectin

    Mechanical changes in tumors have long been linked to increased malignancy and therapy resistance and attributed to mechanical changes in the tumor extracellular matrix (ECM). However, to the best of our knowledge, there have been no mechanical studies on decellularized tumors. Here, we studied the biochemical and mechanical progression of the tumor ECM in two models of lung metastases: lung carcinoma (CAR) and melanoma (MEL). We decellularized the metastatic lung sections, measured the micromechanics of the tumor ECM, and stained the sections for ECM proteins, proliferation, and cell death markers. The same methodology was applied to MEL mice treated with the clinically approved anti-fibrotic drug nintedanib. When compared to healthy ECM (~0.40 kPa), CAR and MEL lung macrometastases produced a highly dense and stiff ECM (1.79 ± 1.32 kPa, CAR and 6.39 ± 3.37 kPa, MEL). Fibronectin was overexpressed from the early stages (~118%) to developed macrometastases (~260%) in both models. Surprisingly, nintedanib caused a 4-fold increase in ECM-occupied tumor area (5.1 ± 1.6% to 18.6 ± 8.9%) and a 2-fold increase in ECM stiffness (6.39 ± 3.37 kPa to 12.35 ± 5.74 kPa). This increase in stiffness strongly correlated with an increase in necrosis, which reveals a potential link between tumor hypoxia and ECM deposition and stiffness. Our findings highlight fibronectin and tumor ECM mechanics as attractive targets in cancer therapy and support the need to identify new anti-fibrotic drugs to abrogate aberrant ECM mechanics in metastases.